An implementation of Newton's method for Keating's potential optimization problems

Authors

  • Anton Anikin
  • Alexander Gornov
Abstract

A modification of Newton’s method for solving multidimensional problems of Keating’s potential optimization is addressed. The proposed approach is based on two main features: information about the special structure of the Hessian matrix and sparse matrix technology. Data structures developed for storing the sparse Hessian matrix and a modification of the linear conjugate gradient method for Hessian matrices with a non-positive determinant are considered. The results of computational experiments and comparison tests are presented.
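The paper's own data structures and solver modifications are not reproduced on this page; as a rough illustration of the two ingredients the abstract names, the sketch below stores a toy Hessian in SciPy's CSR sparse format and runs a linear conjugate-gradient loop that exits as soon as a direction of non-positive curvature is detected, a common safeguard when the Hessian is not positive definite. Function and variable names are illustrative, not taken from the paper.

```python
import numpy as np
from scipy.sparse import csr_matrix

def truncated_cg(H, g, max_iter=200, tol=1e-8):
    """Approximately solve H p = -g with linear conjugate gradients.

    If non-positive curvature is detected (H is not positive definite),
    the iteration stops and returns the best step found so far, falling
    back to steepest descent on the first iteration. H may be any object
    supporting H @ v, e.g. a scipy.sparse.csr_matrix that stores only the
    non-zero Hessian entries.
    """
    n = g.shape[0]
    p = np.zeros(n)
    r = -g.copy()          # residual of H p = -g at p = 0
    d = r.copy()           # first search direction
    rs_old = r @ r
    for _ in range(max_iter):
        Hd = H @ d
        curvature = d @ Hd
        if curvature <= 0.0:
            # non-positive curvature: cut the CG run short
            return p if np.any(p) else -g
        alpha = rs_old / curvature
        p += alpha * d
        r -= alpha * Hd
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        d = r + (rs_new / rs_old) * d
        rs_old = rs_new
    return p

# Toy usage: a small sparse "Hessian" stored in CSR format.
H = csr_matrix(np.array([[4.0, 1.0, 0.0],
                         [1.0, 3.0, 0.0],
                         [0.0, 0.0, 2.0]]))
g = np.array([1.0, -2.0, 0.5])
step = truncated_cg(H, g)
```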


Similar articles

Modify the linear search formula in the BFGS method to achieve global convergence.


Newton's Method for Large Bound-Constrained Optimization Problems

We analyze a trust region version of Newton’s method for bound-constrained problems. Our approach relies on the geometry of the feasible set, not on the particular representation in terms of constraints. The convergence theory holds for linearly constrained problems and yields global and superlinear convergence without assuming either strict complementarity or linear independence of the active ...
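The snippet above is truncated; as a generic illustration of working directly with the geometry of a box-shaped feasible set (not the trust-region algorithm analyzed in that paper), a minimal projection-based step might look like this, with all names chosen here for illustration:

```python
import numpy as np

def project_onto_box(x, lower, upper):
    """Project a point onto the box {lower <= x <= upper}."""
    return np.minimum(np.maximum(x, lower), upper)

def projected_gradient_step(x, grad, lower, upper, step=1e-2):
    """One projected step: move along -grad, then snap back into the box.
    Components pushed onto a bound become active while staying feasible."""
    return project_onto_box(x - step * grad, lower, upper)

# Toy usage on a 3-variable box [0, 1]^3.
x = np.array([0.9, 0.1, 0.5])
g = np.array([2.0, -1.0, 0.0])
lower = np.zeros(3)
upper = np.ones(3)
x_new = projected_gradient_step(x, g, lower, upper, step=0.5)   # -> [0.0, 0.6, 0.5]
```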


A Free Line Search Steepest Descent Method for Solving Unconstrained Optimization Problems

In this paper, we solve unconstrained optimization problems using a free line search steepest descent method. First, we propose a double-parameter scaled quasi-Newton formula for calculating an approximation of the Hessian matrix. The approximation obtained from this formula is a positive definite matrix that satisfies the standard secant relation. We also show that the largest eigenvalue...
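The double-parameter scaled formula itself is not shown in the truncated abstract; for orientation, here is a sketch of the standard (unscaled) BFGS update that such scaled formulas modify, together with a numerical check of the secant relation B_new s = y. The function name and test data are illustrative only.

```python
import numpy as np

def bfgs_update(B, s, y):
    """Standard BFGS update of a Hessian approximation B.

    If B is positive definite and the curvature condition s.T @ y > 0
    holds, the updated matrix remains positive definite and satisfies
    the secant relation B_new @ s = y.
    """
    Bs = B @ s
    return B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / (y @ s)

# Toy check of the secant relation on random data.
rng = np.random.default_rng(0)
B = np.eye(4)
s = rng.standard_normal(4)
y = s + 0.1 * rng.standard_normal(4)      # keeps s.T @ y positive here
B_new = bfgs_update(B, s, y)
assert np.allclose(B_new @ s, y)           # secant relation holds
```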


An Efficient Newton-like Method for Molecular Mechanics

Techniques from numerical analysis and crystallographic refinement have been combined to produce a variant of the Truncated Newton nonlinear optimization procedure. The new algorithm shows particular promise for potential energy minimization of large molecular systems. Usual implementations of Newton's method require storage space proportional to the number of atoms squared (i.e., O(N²)) and c...
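The storage argument can be made concrete: truncated Newton codes commonly avoid forming the full Hessian by evaluating Hessian-vector products on the fly, for instance with a finite difference of gradients. The sketch below shows that generic device; it is not claimed to be the exact scheme used in the paper above, and the names are illustrative.

```python
import numpy as np

def hessian_vector_product(grad_fn, x, v, eps=1e-6):
    """Approximate H(x) @ v by a forward difference of gradients:
    H v ≈ (grad(x + eps*v) - grad(x)) / eps.
    Only gradients are needed, so the full O(n^2) Hessian is never stored.
    """
    return (grad_fn(x + eps * v) - grad_fn(x)) / eps

# Toy usage: f(x) = 0.5 * x.T @ A @ x has gradient A @ x and Hessian A.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
grad_fn = lambda x: A @ x
x = np.array([0.2, -0.4])
v = np.array([1.0, 0.0])
print(hessian_vector_product(grad_fn, x, v))   # close to A @ v = [3, 1]
```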


Tensor Methods for Large, Sparse Unconstrained Optimization

Tensor methods for unconstrained optimization were first introduced by Schnabel and Chow [SIAM J. Optimization, 1 (1991), pp. 293-315], who describe these methods for small to moderate-size problems. The major contribution of this paper is the extension of these methods to large, sparse unconstrained optimization problems. This extension requires an entirely new way of solving the tensor model th...



Journal:
  • Stud. Inform. Univ.

Volume 9, Issue: -

Pages: -

Published: 2011